Conversation
 * - TModel: The specific model name (e.g., 'liveAPI-1')
 * - TProviderOptions: Provider-specific options (already resolved)
 */
export interface LiveAPIAdapter<
I'd call it RealtimeAdapter
Do you mean just the ones in the activities module, or also the Gemini-specific ones inside the ai-gemini module?
So, for example, this subclass: https://github.com/TanStack/ai/pull/189/changes#diff-4092139804ff5293f4635d3c7ef1b5223dda315f2a6ccc686b4b8c010788d876R35
should it rather be GeminiLiveAPIAdapter or GeminiRealtimeAdapter?
Personally I would use Realtime everywhere to keep the semantics consistent.
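As a purely hypothetical sketch of the rename discussed above (none of these names, signatures, or model ids are from the PR itself; they are assumptions for illustration), the base interface and a Gemini subclass might look like:

```typescript
// Hypothetical sketch; names and signatures are assumptions, not the PR's API.
export interface RealtimeAdapter<
  TModel extends string = string,
  TProviderOptions = unknown,
> {
  readonly model: TModel
  connect(options: TProviderOptions): Promise<void>
  disconnect(): Promise<void>
}

// Keeping "Realtime" in provider-specific subclasses keeps the semantics
// consistent between the activities module and the ai-gemini module.
export class GeminiRealtimeAdapter
  implements RealtimeAdapter<'gemini-live-model', { apiKey: string }>
{
  // 'gemini-live-model' is a placeholder, not a real model id.
  readonly model = 'gemini-live-model' as const

  async connect(options: { apiKey: string }): Promise<void> {
    // A real implementation would open the provider's realtime session here.
  }

  async disconnect(): Promise<void> {
    // ...and tear it down here.
  }
}
```

Following this pattern, an OpenAI subclass would sit next to the Gemini one under the same base name.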
Hi, I was working on something similar. How do you plan to add the connection layer between the server and client parts? I was trying to add a WebSocket connection between client and server so that the client can capture audio from the user, send it to the server, and play the incoming audio. Are you planning on adding something similar?
I was thinking about adding support for WebSocket or WebRTC connections, but for now I would focus on adding adapters for high-level integrations like
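As a sketch of what the wire format for such a WebSocket audio bridge could look like (a pure assumption, not anything from this PR; the field names and JSON framing are invented, and it assumes a Node.js environment for `Buffer`), audio chunks could be framed as JSON messages:

```typescript
// Hypothetical wire format for streaming microphone audio over a WebSocket;
// the message shape and field names are assumptions, not part of this PR.
type AudioMessage = {
  type: 'audio'
  sampleRate: number
  data: string // base64-encoded 16-bit PCM samples
}

// Client side: pack a chunk of PCM samples into a message to send over the socket.
function encodeAudioChunk(samples: Int16Array, sampleRate: number): string {
  const bytes = Buffer.from(samples.buffer, samples.byteOffset, samples.byteLength)
  const message: AudioMessage = { type: 'audio', sampleRate, data: bytes.toString('base64') }
  return JSON.stringify(message)
}

// Server side: unpack the message back into playable PCM samples.
function decodeAudioChunk(raw: string): { samples: Int16Array; sampleRate: number } {
  const message = JSON.parse(raw) as AudioMessage
  const bytes = Buffer.from(message.data, 'base64')
  // Copy into a fresh, aligned buffer before reinterpreting as 16-bit samples.
  const aligned = new Uint8Array(bytes)
  return { samples: new Int16Array(aligned.buffer), sampleRate: message.sampleRate }
}
```

In a real integration, the client would fill `samples` from getUserMedia / an AudioWorklet, and the server would forward the decoded audio to the provider's realtime session.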
Do we expect that other vendors will create symmetric "live" APIs? I'm hesitant to put in a feature that is specific to only one provider.
Currently OpenAI and Google have such models, so it would apply to both of them.
Cool. Can you add the one for OpenAI then?
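A purely illustrative sketch of what an OpenAI counterpart could look like; every name and signature here is an assumption, not the PR's actual API:

```typescript
// Illustrative only; names and signatures are assumptions, not the PR's API.
interface RealtimeAdapter<TModel extends string, TProviderOptions> {
  readonly model: TModel
  connect(options: TProviderOptions): Promise<void>
}

// OpenAI's realtime models speak over a WebSocket; this only sketches the
// shape an adapter might take, symmetric to the Gemini one discussed above.
class OpenAIRealtimeAdapter
  implements RealtimeAdapter<'openai-realtime-model', { apiKey: string }>
{
  // 'openai-realtime-model' is a placeholder, not a real model id.
  readonly model = 'openai-realtime-model' as const

  async connect(options: { apiKey: string }): Promise<void> {
    // A real implementation would authenticate and open the realtime socket here.
  }
}
```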
🎯 Changes
This PR introduces the Live API adapters setup.
✅ Checklist
pnpm run test:pr.
🚀 Release Impact